Some Structural Complexity Aspects of Neural Computation
Authors
Abstract
Recent work by Siegelmann and Sontag has demonstrated that polynomial time on linear saturated recurrent neural networks equals polynomial time on standard computational models: Turing machines if the weights of the net are rational, and nonuniform circuits if the weights are real. Here we develop further connections between the languages recognized by such neural nets and other complexity classes. We present connections to space-bounded classes, simulation of parallel computational models such as Vector Machines, and a discussion of the characterizations of various nonuniform classes in terms of Kolmogorov complexity.
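For readers unfamiliar with the model referred to in the abstract, the following Python sketch shows the update rule of a saturated-linear recurrent net in the Siegelmann–Sontag style: each neuron applies the piecewise-linear saturation sat(x) = min(1, max(0, x)) to an affine combination of the current state and the external input. The network size, weights, and input word below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sat(x):
    """Saturated-linear activation: identity on [0, 1], clipped to 0 below and 1 above."""
    return np.clip(x, 0.0, 1.0)

def step(x, u, A, B, c):
    """One synchronous network update: x(t+1) = sat(A x(t) + B u(t) + c)."""
    return sat(A @ x + B @ u + c)

# Hypothetical toy instance: 3 neurons, one input line, randomly chosen weights.
rng = np.random.default_rng(0)
A = rng.uniform(-1.0, 1.0, size=(3, 3))  # recurrent weight matrix
B = rng.uniform(-1.0, 1.0, size=(3, 1))  # input weight matrix
c = np.zeros(3)                          # bias vector
x = np.zeros(3)                          # initial state
for bit in (1.0, 0.0, 1.0):              # feed a short binary input word, one bit per step
    x = step(x, np.array([bit]), A, B, c)
print(x)
```

With rational weights such a network can simulate a Turing machine step by step, which is what underlies the polynomial-time equivalence stated in the abstract; with real weights the extra information encoded in the weights yields the nonuniform (circuit) characterization.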
Similar resources
A New Method for Error Concealment in Video Frames Using an RBF Neural Network
Transmission of compressed video over error prone channels may result in packet losses, which can degrade the image quality. Error concealment (EC) is an effective approach to reduce the degradation caused by the missed information. The conventional temporal EC techniques are always inefficient when the motions of the video object are irregular. In this paper, in order to overcome this problem,...
A New Approach for Investigating the Complexity of Short Term EEG Signal Based on Neural Network
Background and purpose: The nonlinear quality of electroencephalography (EEG), like other irregular signals, can be quantified. Some of these values, such as Lyapunov's representative, study the signal path divergence and some quantifiers need to reconstruct the signal path but some do not. However, all of these quantifiers require a long signal to quantify the signal complexity. Mate...
Discrete versus Analog Computation: Aspects of Studying the Same Problem in Different Computational Models (produced as part of the ESPRIT Working Group in Neural and Computational Learning II, NeuroCOLT2 27150)
In this tutorial we want to outline some of the features coming up when analyzing the same computational problems in different complexity theoretic frameworks. We will focus on two problems; the first related to mathematical optimization and the second dealing with the intrinsic structure of complexity classes. Both examples serve well for working out to what extent different approaches to the same pro...
Data dimensionality reduction with application to simplifying RBF network structure and improving classification performance
For high dimensional data, if no preprocessing is carried out before inputting patterns to classifiers, the computation required may be too heavy. For example, the number of hidden units of a radial basis function (RBF) neural network can be too large. This is not suitable for some practical applications due to speed and memory constraints. In many cases, some attributes are not relevant to con...
A Survey on Complexity of Integrity Parameter
Many graph theoretical parameters have been used to describe the vulnerability of communication networks, including toughness, binding number, rate of disruption, neighbor-connectivity, integrity, mean integrity, edge-connectivity vector, l-connectivity and tenacity. In this paper we discuss Integrity and its properties in vulnerability calculation. The integrity of a graph G, I(G), is defined t...
Journal title:
Volume, Issue
Pages -
Publication date: 1993